Proximal Newton-Type Methods for Minimizing Composite Functions
Authors: Jason D. Lee, Yuekai Sun, Michael A. Saunders
Abstract
We generalize Newton-type methods for minimizing smooth functions to handle a sum of two convex functions: a smooth function and a nonsmooth function with a simple proximal mapping. We show that the resulting proximal Newton-type methods inherit the desirable convergence behavior of Newton-type methods for minimizing smooth functions, even when search directions are computed inexactly. Many popular methods tailored to problems arising in bioinformatics, signal processing, and statistical learning are special cases of proximal Newton-type methods, and our analysis yields new convergence results for some of these methods. Technical report no. SOL 2013-1. Department of Management Science and Engineering, Stanford University, Stanford, California, May 31, 2013.
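As a concrete illustration of one proximal Newton step, here is a minimal Python sketch for the case h = λ‖·‖₁, with the scaled proximal subproblem solved approximately by inner proximal gradient iterations. The function names, the ℓ₁ penalty, and the choice of inner solver are illustrative assumptions, not the paper's prescribed implementation.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal mapping of t*||.||_1: elementwise soft-thresholding."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def prox_newton_step(x, grad_g, hess_g, lam, inner_iters=50):
    """One proximal Newton step for f(x) = g(x) + lam*||x||_1.

    Builds the local quadratic model of g at x and solves the scaled
    proximal subproblem
        min_z  grad'(z - x) + 0.5*(z - x)' H (z - x) + lam*||z||_1
    by inner proximal gradient iterations (an illustrative choice of
    subproblem solver; any solver for this subproblem would do).
    """
    grad, H = grad_g(x), hess_g(x)
    L = np.linalg.norm(H, 2)  # spectral norm bounds the model's curvature
    z = x.copy()
    for _ in range(inner_iters):
        model_grad = grad + H @ (z - x)        # gradient of the quadratic model
        z = soft_threshold(z - model_grad / L, lam / L)
    return z - x                               # search direction
```

In practice the returned direction would be combined with a line search on the composite objective before updating x.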
Similar papers
Finite Element Analysis of the Effect of Proximal Contour of Class II Composite Restorations on Stress Distribution
Introduction: The aim of this study was to evaluate the effect of the proximal contour of class II composite restorations placed with a straight or contoured matrix band, using composite resins with different moduli of elasticity, on stress distribution by the finite element method. Methods: In order to evaluate the stress distribution of class II composite restorations using the finite element method, upper ...
Proximal Newton-type methods for convex optimization
We seek to solve convex optimization problems in composite form: minimize_{x ∈ Rⁿ} f(x) := g(x) + h(x), where g is convex and continuously differentiable and h : Rⁿ → R is a convex but not necessarily differentiable function whose proximal mapping can be evaluated efficiently. We derive a generalization of Newton-type methods to handle such convex but nonsmooth objective functions. We prove such met...
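For reference, the proximal mapping assumed here to be efficiently evaluable is the standard one; for h = λ‖·‖₁ it reduces to elementwise soft-thresholding.

```latex
\operatorname{prox}_h(x) \;=\; \operatorname*{arg\,min}_{y \in \mathbb{R}^n}
  \Bigl\{\, h(y) + \tfrac{1}{2}\,\lVert y - x \rVert_2^2 \,\Bigr\}
```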
Convergence analysis of inexact proximal Newton-type methods
We study inexact proximal Newton-type methods to solve convex optimization problems in composite form: minimize_{x ∈ Rⁿ} f(x) := g(x) + h(x), where g is convex and continuously differentiable and h : Rⁿ → R is a convex but not necessarily differentiable function whose proximal mapping can be evaluated efficiently. Proximal Newton-type methods require the solution of subproblems to obtain the search ...
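The sketch below illustrates one plausible inexactness rule: stop the inner solver once the subproblem's fixed-point residual falls below a forcing fraction of the outer optimality residual. The rule and all names here are illustrative assumptions, not the paper's exact adaptive stopping criterion.

```python
import numpy as np

def inexact_direction(x, grad_g, hess_g, prox, eta=0.1, max_inner=200):
    """Inexact proximal Newton direction for f = g + h.

    prox(v, t) must return prox_{t*h}(v). The inner proximal gradient
    loop on the quadratic model stops once its fixed-point residual is
    below eta times the outer optimality residual (an illustrative
    forcing-term rule).
    """
    grad, H = grad_g(x), hess_g(x)
    L = np.linalg.norm(H, 2)
    tol = eta * np.linalg.norm(x - prox(x - grad, 1.0))  # outer residual
    z = x.copy()
    for _ in range(max_inner):
        z_next = prox(z - (grad + H @ (z - x)) / L, 1.0 / L)
        if L * np.linalg.norm(z_next - z) <= tol:
            z = z_next
            break
        z = z_next
    return z - x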
Generalized Self-Concordant Functions: A Recipe for Newton-Type Methods
We study the smooth structure of convex functions by generalizing the powerful concept of self-concordance, introduced by Nesterov and Nemirovskii in the early 1990s, to a broader class of convex functions, which we call generalized self-concordant functions. This notion allows us to develop a unified framework for designing Newton-type methods to solve convex optimization problems. The prop...
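For orientation, the classical definition being generalized is the following; the paper's generalization replaces the fixed constant and exponent with parameters, and the precise multivariate form is given there.

```latex
% Classical self-concordance (Nesterov--Nemirovskii): f is
% self-concordant if every restriction \varphi(t) = f(x + t v) obeys
\lvert \varphi'''(t) \rvert \;\le\; 2\,\varphi''(t)^{3/2}.
```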
An Inexact Accelerated Proximal Gradient Method for Large Scale Linearly Constrained Convex SDP
The accelerated proximal gradient (APG) method, first proposed by Nesterov for minimizing smooth convex functions, later extended by Beck and Teboulle to composite convex objective functions, and studied in a unifying manner by Tseng, has proven to be highly efficient in solving some classes of large scale structured convex optimization (possibly nonsmooth) problems, including nuclear norm mini...
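A minimal FISTA-style sketch of the APG iteration for f = g + h, assuming a known Lipschitz constant for the gradient of g. The function names are placeholders, and this is the basic exact, unconstrained variant rather than the paper's inexact, linearly constrained SDP setting.

```python
import numpy as np

def apg(grad_g, prox_h, x0, step, iters=100):
    """Accelerated proximal gradient (FISTA) for f = g + h.

    grad_g : callable, gradient of the smooth part g
    prox_h : callable, prox_h(v, t) returns prox_{t*h}(v)
    step   : fixed step size, e.g. 1/L for L-Lipschitz grad_g
    """
    x, y, t = x0.copy(), x0.copy(), 1.0
    for _ in range(iters):
        x_next = prox_h(y - step * grad_g(y), step)        # proximal gradient step
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0  # momentum parameter
        y = x_next + ((t - 1.0) / t_next) * (x_next - x)   # extrapolation
        x, t = x_next, t_next
    return x
```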
Journal: SIAM Journal on Optimization
Volume: 24, Issue: -
Pages: -
Publication date: 2014